Expectation is a location measure: it gives the location of the center of a random variable's distribution.
For discrete random variables, the expectation is $E(X)=\sum_{x \in \mathbb{R}} x\,P(X=x)$. Let $X, Y$ be discrete random variables and let $a, b \in \mathbb{R}$ be constants. If $Z = aX + bY$, then $E(Z) = aE(X) + bE(Y)$.
For continuous random variables, the expectation is $E(X)=\int_{-\infty}^{\infty} x\,f(x)\,dx$. Let $X, Y$ be continuous random variables and let $a, b \in \mathbb{R}$ be constants. If $Z = aX + bY$, then $E(Z) = aE(X) + bE(Y)$.
If $X, Y$ are independent random variables, then $E(XY) = E(X)E(Y)$.
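As a quick sanity check, here is a minimal Monte Carlo sketch of linearity and of the independence product rule (assuming NumPy; the Poisson/Exponential pair and the constants are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative choice: X ~ Poisson(3) (discrete), Y ~ Exponential(mean 2) (continuous),
# drawn independently; a, b are arbitrary constants.
X = rng.poisson(3.0, n)
Y = rng.exponential(2.0, n)
a, b = 2.0, -5.0

# Linearity: E(aX + bY) = aE(X) + bE(Y) = 2*3 + (-5)*2 = -4
print((a * X + b * Y).mean())        # ~ -4.0
print(a * X.mean() + b * Y.mean())   # identical up to floating point

# Independence: E(XY) = E(X)E(Y) = 3 * 2 = 6
print((X * Y).mean())                # ~ 6.0
```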
More generally, for a function $g$ applied to $X$, the expectation of $g(X)$ is:
$$E(g(X)) = \begin{cases} \int_{-\infty}^{\infty} g(x)\,f(x)\,dx & X \text{ is continuous} \\ \sum_{x \in \mathbb{R}} g(x)\,P(X=x) & X \text{ is discrete} \end{cases}$$
For the joint situation, with a function $h(x,y):\mathbb{R}^2 \to \mathbb{R}$:
$$E(h(X,Y)) = \begin{cases} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(x,y)\,f(x,y)\,dx\,dy & X, Y \text{ are continuous} \\ \sum_{y \in \mathbb{R}}\sum_{x \in \mathbb{R}} h(x,y)\,P(X=x,\,Y=y) & X, Y \text{ are discrete} \end{cases}$$
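A small sketch of these "expectation of a function" formulas in both cases (assuming NumPy; the Uniform and Binomial examples are arbitrary):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)

# Continuous case: X ~ Uniform(0, 1) with g(x) = x^2.
# E[g(X)] = ∫_0^1 x^2 dx = 1/3, which differs from g(E[X]) = (1/2)^2 = 1/4.
x = rng.uniform(0.0, 1.0, 1_000_000)
print((x ** 2).mean())                           # ~ 0.333

# Discrete case: X ~ Binomial(10, 0.5) with g(x) = 2^x.
# E[g(X)] = sum_x 2^x P(X = x) = (0.5 + 2 * 0.5)^10 = 1.5^10 ~ 57.67.
k = np.arange(11)
pmf = np.array([comb(10, int(i)) for i in k]) * 0.5 ** 10
print(float((2.0 ** k * pmf).sum()), 1.5 ** 10)  # both ~ 57.67
```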
There is also the tail formula $E(X)=\int_0^{\infty} P(X>t)\,dt - \int_{-\infty}^{0} P(X<t)\,dt$, provided both integrals are finite.
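A quick numerical check of the tail formula, using $X \sim \text{Exponential}(1)$ as an assumed example (here $P(X<t)=0$ for $t<0$, so only the first integral contributes):

```python
import numpy as np

# Check E(X) = ∫_0^∞ P(X > t) dt − ∫_{−∞}^0 P(X < t) dt for X ~ Exponential(1):
# P(X > t) = e^{-t} for t ≥ 0 and the second integral vanishes since X ≥ 0,
# so the tail integral should come out as E(X) = 1.
t = np.linspace(0.0, 50.0, 200_001)
dt = t[1] - t[0]
survival = np.exp(-t)
print(float(survival.sum() * dt))  # ~ 1.0
```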
The linearity property of expectation, $\sum_{i=1}^{n} E[X_i] = E\left[\sum_{i=1}^{n} X_i\right]$, holds automatically only for finitely many terms. For an infinite sum, extra conditions are needed for it to hold (for example, non-negative $X_i$ or absolute convergence of the series).
Let $Y_1, Y_2$ be independent and identically distributed (so they have the same mean and variance, and $E[Y_1Y_2]=E[Y_1]E[Y_2]$). Then
$$\begin{aligned} E[Y_1^2-2Y_1Y_2+Y_2^2] &= E[Y_1^2]-2E[Y_1]E[Y_2]+E[Y_2^2] \\ &= \operatorname{Var}[Y_1]+(E[Y_1])^2+\operatorname{Var}[Y_2]+(E[Y_2])^2-2E[Y_1]E[Y_2] \\ &= \operatorname{Var}[Y_1]+\operatorname{Var}[Y_2]+(E[Y_1]-E[Y_2])^2 \\ &= 2\operatorname{Var}[Y], \end{aligned}$$
where the last step uses that the two variables share the same mean (so the squared difference vanishes) and the same variance.
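A simulation sketch of this identity with an arbitrary choice of distribution (i.i.d. normals, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Illustrative choice: Y1, Y2 i.i.d. Normal(mean 5, sd 3), so Var[Y] = 9 and
# E[(Y1 - Y2)^2] = E[Y1^2 - 2*Y1*Y2 + Y2^2] should be 2 * Var[Y] = 18.
y1 = rng.normal(5.0, 3.0, n)
y2 = rng.normal(5.0, 3.0, n)
print(((y1 - y2) ** 2).mean())  # ~ 18.0
```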
Correlation: we can calculate the correlation between two random variables by the formula $\operatorname{Corr}(X,Y)=\dfrac{\operatorname{Cov}(X,Y)}{\sqrt{\operatorname{Var}[X]\operatorname{Var}[Y]}}$.
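A short sketch computing the correlation from this formula and comparing it against NumPy's built-in estimate (the linear-plus-noise pair is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Illustrative dependent pair: X ~ N(0, 1), Y = 2X + independent N(0, 1) noise,
# so Cov(X, Y) = 2, Var[X] = 1, Var[Y] = 5 and Corr(X, Y) = 2 / sqrt(5) ~ 0.894.
x = rng.normal(0.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)

cov = ((x - x.mean()) * (y - y.mean())).mean()  # sample Cov(X, Y)
corr = cov / np.sqrt(x.var() * y.var())         # Cov / sqrt(Var[X] Var[Y])
print(corr, np.corrcoef(x, y)[0, 1])            # both ~ 0.894
```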
Let $X$ be a discrete random variable taking non-negative integer values. We define the probability generating function (PGF) as $r_X(t)=E(t^X),\ t \in \mathbb{R}$. Evaluating the PGF and its derivatives at $0$ recovers the probability mass function: $r_X(0)=P(X=0)$, and
$r_X'(0)=P(X=1)$
$r_X''(0)=2P(X=2)$
$r_X^{(k)}(0)=k!\,P(X=k)$
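As an illustration, the Poisson($\lambda$) PGF $r_X(t)=e^{\lambda(t-1)}$ can be differentiated symbolically to confirm $r_X^{(k)}(0)=k!\,P(X=k)$ (a sketch assuming SymPy):

```python
import sympy as sp

t = sp.symbols("t")
lam = sp.symbols("lam", positive=True)

# PGF of a Poisson(lam) random variable (standard closed form): r_X(t) = exp(lam*(t - 1)).
r = sp.exp(lam * (t - 1))

# r_X^{(k)}(0) should equal k! * P(X = k) = k! * exp(-lam) * lam^k / k! = exp(-lam) * lam^k.
for k in range(4):
    deriv_at_0 = sp.diff(r, t, k).subs(t, 0)
    pmf_term = sp.exp(-lam) * lam ** k / sp.factorial(k)
    print(k, sp.simplify(deriv_at_0 - sp.factorial(k) * pmf_term))  # 0 every time
```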
For any random variable $X$, we have the moment generating function (MGF), defined whenever the expectation exists as $m_X(s)=E[e^{sX}],\ s \in \mathbb{R}$; its derivatives at $0$ give the raw moments. Assume $m_X(s)<\infty$ for $s \in (-s_0, s_0)$ with $s_0>0$, which leads to:
$m_X(0)=1$
$m_X'(0)=E[X]$
$m_X''(0)=E[X^2]$
$m_X^{(k)}(0)=E[X^k]$
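A symbolic sketch with the Exponential(rate $\lambda$) MGF $m_X(s)=\lambda/(\lambda-s)$, valid for $s<\lambda$, recovering the raw moments $E[X^k]=k!/\lambda^k$ (assuming SymPy):

```python
import sympy as sp

s = sp.symbols("s")
lam = sp.symbols("lam", positive=True)

# MGF of an Exponential(rate lam) random variable, finite for s < lam: m_X(s) = lam / (lam - s).
m = lam / (lam - s)

print(m.subs(s, 0))                 # 1, since m_X(0) = E[e^0] = 1
print(sp.diff(m, s).subs(s, 0))     # 1/lam    = E[X]
print(sp.diff(m, s, 2).subs(s, 0))  # 2/lam^2  = E[X^2]
print(sp.diff(m, s, 3).subs(s, 0))  # 6/lam^3  = E[X^3]
```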
Let $X, Y$ be two independent random variables with MGFs $m_X(t), m_Y(t)$. Then the MGF of $X+Y$ is $m_{X+Y}(t)=E[e^{t(X+Y)}]=E[e^{tX}]\,E[e^{tY}]=m_X(t)\,m_Y(t)$. The same result holds for any finite number of independent random variables, and we can match the resulting MGF against known MGFs to determine the distribution of the sum.
For $s \in (-s_0, s_0)$ with $s_0>0$: if $m_Y(s)=m_X(s)$ on this interval, then $X$ and $Y$ have the same distribution.
Let $X, Y$ be independent random variables and let $Z=aX+bY$. Then $m_Z(t)=E[e^{tZ}]=E[e^{atX}]\,E[e^{btY}]=m_X(at)\,m_Y(bt)$.
If $X$ is a constant, say $X=c$, then $m_Z(t)=E[e^{act}]\,m_Y(bt)=e^{act}\,m_Y(bt)$.
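A symbolic sketch of using MGFs to identify the distribution of a sum: for independent normals, the product of the MGFs is again a normal MGF with the means and variances added (assuming SymPy; the parameter symbols are illustrative):

```python
import sympy as sp

t = sp.symbols("t")
mu1, mu2, v1, v2 = sp.symbols("mu1 mu2 v1 v2", positive=True)

# MGF of a Normal(mean mu, variance v) random variable: exp(mu*t + v*t^2/2).
def normal_mgf(mu, v):
    return sp.exp(mu * t + v * t ** 2 / 2)

# For independent X ~ N(mu1, v1) and Y ~ N(mu2, v2), the product of their MGFs
# equals the MGF of N(mu1 + mu2, v1 + v2); by MGF uniqueness this identifies
# the distribution of X + Y.
product = normal_mgf(mu1, v1) * normal_mgf(mu2, v2)
target = normal_mgf(mu1 + mu2, v1 + v2)
print(sp.simplify(product - target))  # 0
```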
Let $X$ be a discrete random variable and let $A$ be an event with $P(A)>0$. The conditional expectation of $X$ given $A$ is $E[X \mid A]=\sum_{x \in \mathbb{R}} x\,\dfrac{P(X=x,\,A)}{P(A)}$. For conditioning on another random variable $Y$: if $Y$ is also discrete and $P(Y=y)>0$, then $E[X \mid Y=y]=\sum_{x \in \mathbb{R}} x\,\dfrac{P(X=x,\,Y=y)}{P(Y=y)}$.
Let $X$ and $Y$ be jointly absolutely continuous random variables with joint density function $f_{X,Y}(x,y)$. The conditional expectation of $X$ given $Y=y$ is $E[X \mid Y=y]=\int_{x \in \mathbb{R}} x\,f_{X \mid Y}(x \mid y)\,dx=\int_{x \in \mathbb{R}} x\,\dfrac{f_{X,Y}(x,y)}{f_Y(y)}\,dx$.
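A tiny discrete sketch of the conditioning formula (the fair-die example is an arbitrary choice):

```python
import numpy as np

# Discrete illustration: X is a fair six-sided die and A = {X is even}.
# E[X | A] = sum_x x * P(X = x, A) / P(A) = (2 + 4 + 6) * (1/6) / (1/2) = 4.
x_vals = np.arange(1, 7)
pmf = np.full(6, 1 / 6)
in_A = x_vals % 2 == 0

p_A = pmf[in_A].sum()
cond_exp = (x_vals[in_A] * pmf[in_A]).sum() / p_A
print(cond_exp)  # 4.0
```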
Let $X, Y, W$ be random variables, let $Z=aX+bY$, and let $A$ be an event. Then:
$E[Z \mid A]=E[aX+bY \mid A]=aE[X \mid A]+bE[Y \mid A]$
$E[Z \mid W]=aE[X \mid W]+bE[Y \mid W]$
We also have the law of total expectation for jointly distributed random variables $X, Y$: $E_Y[E[X \mid Y]]=E[X]$ and $E_X[E[Y \mid X]]=E[Y]$.
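A simulation sketch of total expectation under an assumed two-stage model (NumPy; the Poisson/Binomial hierarchy is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Assumed two-stage model: Y ~ Poisson(4) and X | Y = y ~ Binomial(y, 0.3),
# so E[X | Y] = 0.3 * Y and total expectation gives E[X] = E[E[X | Y]] = 0.3 * 4 = 1.2.
y = rng.poisson(4.0, n)
x = rng.binomial(y, 0.3)

print(x.mean())          # ~ 1.2, direct estimate of E[X]
print((0.3 * y).mean())  # ~ 1.2, estimate of E_Y[E[X | Y]]
```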